23 research outputs found

    Designing Wearable Personal Assistants for Surgeons: An Egocentric Approach

    Magic Pointing for Eyewear Computers

    A Body-and-Mind-Centric Approach to Wearable Personal Assistants

    6th international workshop on pervasive eye tracking and mobile eye-based interaction

    Previous work on eye tracking and eye-based human-computer interfaces mainly concentrated on making use of the eyes in traditional desktop settings. With the recent growth of interest in wearable computers, such as smartwatches, smart eyewear, and low-cost mobile eye trackers, eye-based interaction techniques for mobile computing are becoming increasingly important. PETMEI 2016 focuses on the pervasive eye tracking paradigm as a trailblazer for mobile eye-based interaction, taking eye tracking out into the wild, to mobile and pervasive settings. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

    EyeSeeThrough: Unifying Tool Selection and Application in Virtual Environments

    In 2D interfaces, actions are often represented by fixed tools arranged in menus, palettes, or dedicated parts of a screen, whereas 3D interfaces afford their arrangement at different depths relative to the user, who can also move them relative to each other. In this paper, we introduce EyeSeeThrough, a novel interaction technique that utilizes eye tracking in VR. The user can apply an action to an intended object by visually aligning the object with the tool along the line of sight and then issuing a confirmation command. The underlying idea is to merge the two-step process of 1) selecting a mode in a menu and 2) applying it to a target into one unified interaction. We present a user study comparing the method to the baseline two-step selection. The results showed that our technique outperforms two-step selection in terms of speed and comfort. We further developed a prototype of a virtual living room to demonstrate the practicality of the proposed technique.
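    The unified select-and-apply condition described in the abstract can be sketched as a simple gaze-ray alignment test. This is a minimal geometric illustration, not code from the paper: the function names and the 3-degree alignment threshold are assumptions for the sketch, and a real VR implementation would use the engine's eye-tracking API instead of raw vectors.

    ```python
    import math

    def _normalize(v):
        # Scale a 3D vector to unit length.
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def _angle_deg(a, b):
        # Angle in degrees between two 3D direction vectors.
        a, b = _normalize(a), _normalize(b)
        dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
        return math.degrees(math.acos(dot))

    def eyeseethrough_aligned(eye, gaze_dir, tool_pos, target_pos,
                              threshold_deg=3.0):
        # True when both the tool and the target lie (within a small
        # angular threshold) on the user's line of sight, i.e. the user
        # has visually aligned them and a confirmation command would
        # select the tool and apply it to the target in one step.
        to_tool = tuple(t - e for t, e in zip(tool_pos, eye))
        to_target = tuple(t - e for t, e in zip(target_pos, eye))
        return (_angle_deg(gaze_dir, to_tool) <= threshold_deg
                and _angle_deg(gaze_dir, to_target) <= threshold_deg)
    ```

    With the tool at depth 1 and the target directly behind it at depth 2 along the gaze ray, the condition holds; moving the tool off-axis breaks the alignment and the confirmation would do nothing.
    
    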

    A wearable kids' health monitoring system on smartphone

    A Wrist-Worn Thermohaptic Device for Graceful Interruption

    Thermal haptics is a potential system output modality for wearable devices that promises to function at the periphery of human attention. When adequately combined with existing attention-governing mechanisms of the human mind, it could be used to interrupt the human agent at a time when the negative influence on the ongoing activity is minimal. In this article we present our self-mitigated interruption concept (essentially a symbiosis of artificial external stimuli tuned to existing human attention management mechanisms), then develop a prototype and perform a pilot study laying the ground for using a wrist-worn thermohaptic actuator for self-mitigated interruption. We frame our empirical thermohaptic experimental work in terms of Peripheral Interaction concepts and show how this new approach to Human-Computer Interaction relates to "Egocentric Interaction", a Context-Aware-systems-inspired approach aimed at supporting the design of envisioned Wearable Personal Assistants intended, among other things, to help human perception and cognition with the management of interruptions.